Web Survey Bibliography
We used the Web 2.0 application Twitter as a platform for formative evaluation in two courses (a pilot study and an experimental study). After each lesson, students answered evaluation questions via web browser, SMS, or instant messenger. Both courses were also evaluated summatively, online and offline. The offline summative evaluation took place in the last lecture session; the online summative evaluation was carried out by the evaluation unit of the University of Vienna one week after the offline evaluation.
The aim of our research was to find out whether Twitter would be a useful instrument for formative course evaluation. We also wanted to verify whether the formative evaluation would come to the same conclusions as the summative online evaluation, and whether the summative offline evaluation would agree with the online summative evaluation conducted seven days later. Another point of interest was whether the formative evaluation would influence the offline summative evaluation.
Participants were students enrolled in two different courses. In the pilot study (n=26), 20 students (response rate 77%) participated in both the summative and the formative evaluation; 21 participants (response rate 81%) also completed the official summative online evaluation. In the experimental study (n=40), 20 students were chosen to take part in the formative evaluation (experimental group). 19 of them (response rate 95%) participated in the formative evaluation, and 15 (response rate 75%) took part in the summative evaluation at the end of the term; 25 participants (response rate 63%) also completed the official summative online evaluation.
Students rated the evaluation via Twitter as useful, and both teachers and students profited from the approach. Because of Twitter's ease of use and the electronic data handling, administrative effort was minimal. We found that the formative and summative evaluations did not come to the same conclusions. T-tests comparing the offline summative evaluation with the online summative evaluation revealed no differences. There were also no differences between the control and experimental groups in the summative offline evaluation, which indicates that the formative evaluation had no impact on the summative offline evaluation.
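The group comparisons above rest on independent-samples t-tests. As a minimal illustration only (this is not the authors' actual analysis, and the rating data below are made up), Welch's t statistic for two groups of course ratings can be computed like this:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples,
    allowing unequal variances and unequal group sizes."""
    va, vb = variance(a), variance(b)  # sample variances
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / ((va / na + vb / nb) ** 0.5)

# Hypothetical 5-point course ratings for an experimental and a
# control group (illustrative values, not the study's data).
experimental = [4, 5, 3, 4, 4, 5, 4, 3]
control = [4, 4, 3, 5, 4, 4, 3, 4]

t = welch_t(experimental, control)
```

A |t| near zero, as in the abstract's reported comparisons, is consistent with "no difference" between the groups; a full analysis would additionally compute degrees of freedom and a p-value.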
Web survey bibliography (317)
- Overview: Online Surveys; 2017; Vehovar, V.; Lozar Manfreda, K.
- Respondent mode choice in a smartphone survey; 2017; Conrad, F. G.; Schober, M. F.; Antoun, C.; Yan, H. Y.; Hupp, A.; Johnston, M.; Ehlen, P.; Vickers, L...
- Collecting Data from mHealth Users via SMS Surveys: A Case Study in Kenya; 2016; Johnson, D.
- Electronic and paper based data collection methods in library and information science research: A comparative...; 2016; Tella, A.
- Stable Relationships, Stable Participation? The Effects of Partnership Dissolution and Changes in Relationship...; 2016; Mueller, B.; Castiglioni, L.
- Identifying Pertinent Variables for Nonresponse Follow-Up Surveys. Lessons Learned from 4 Cases in Switzerland...; 2016; Vandenplas, C.; Joye, D.; Staehli, M. E.; Pollien, A.
- The 2013 Census Test: Piloting Methods to Reduce 2020 Census Costs; 2016; Walejko, G. K.; Miller, P. V.
- The Validity of Surveys: Online and Offline; 2016; Wiersma, W.
- Methods can matter: Where Web surveys produce different results than phone interviews; 2016; Keeter, S.
- Do Polls Still Work If People Don't Answer Their Phones?; 2016; Edwards-Levy, A.; Jackson, N. M.
- HUFFPOLLSTER: Why Reaching Latinos Is A Challenge For Pollsters; 2016; Jackson, N. M.; Edwards-Levy, A.; Velencia, J.
- Comprehension and engagement in survey interviews with virtual agents; 2016; Conrad, F. G.; Schober, M. F.; Jans, M.; Orlowski, R. A.; Nielsen, D.; Levenstein, R. M.
- An Overview of Mobile CATI Issues in Europe; 2015; Slavec, A.; Toninelli, D.
- Using Mobile Phones for High-Frequency Data Collection; 2015; Azevedo, J. P.; Ballivian, A.; Durbin, W.
- Mixed mode surveys; 2015; Burton, J.
- Two Are Better Than One: The Use of a Mixed-Mode Data Collection to Improve the Electoral Forecast; 2014; de Rada, V. D.; Pasadas del Amo, S.
- The impact of contact effort on mode-specific selection and measurement bias; 2014; Schouten, B.; van der Laan, J.; Cobben, F.
- How much is shorter CAWI questionnaire VS CATI questionnaire?; 2014; Bartoli, B.
- Advantages of a global multimodal print & digital readership survey; 2013; Cour, N.; Saint-Joanis, G.
- Relative Mode Effects on Data Quality in Mixed-Mode Surveys by an Instrumental Variable; 2013; Vannieuwenhuyze, J. T. A.; Revilla, M.
- A report on the Confirmit Market Research Software Survey 2013; 2013; Macer, T.; Wilson, S.
- Mode effect analysis and adjustment in a split-sample mixed-mode Web/CATI survey; 2013; Kolenikov, S.; Kennedy, C.
- Evaluating the left-right dimension: Category Selection Probing conducted in an online access...; 2013; Huefken, V.
- Methodological, legal and technical perspectives on the feasibility of web survey paradata in German...; 2013; Sattelberger, S.
- Impact of mode design on reliability in longitudinal data; 2013; Cernat, A.
- Exploring patterns of academic usage: A Google Scholar based study of ESS, EVS, WVS and ISSP academic...; 2013; Malnar, B.
- Web questionnaires in official population surveys: Do's and don'ts First experiments and impacts...; 2013; Blanke, K.
- Mode effects in Labour Force Surveys - do they really matter?; 2013; Koerner, T.
- Measuring the same concepts in several modes in the "BIBB/BAuA-Employee-Survey 2011/12"; 2013; Gensicke, M.; Tschersich, N.; Hartmann, J.
- What works? Getting the General Population To Go Online in a Mixed Mode Local Health Survey; 2013; Frigault, L.-R.; Azzou, S. A. K.; Molloy, E. J. K.; Ammarguellat, F.; Couture, M.; Gratton, J.
- Using Technology to Conduct Questionnaire Evaluations with Hard to Reach Populations; 2013; Ridolfo, H.; Ott, K.
- Mode Effects in a National Establishment Survey; 2013; Daley, K.; Phillips, B. T.
- Evaluating the Effect of a Non-Monetary Incentive in a Nationally Representative Mixed-Mode Establishment...; 2013; Sengupta, M.; Harris-Kojetin, L.; Hobbs, M.; Greene, A.
- Survey Reminder Method Experiment: An Examination of Cost Efficiency and Reminder Mode Salience in the...; 2013; Anderson, M.; Rogers, B.; CyBulski, K.; Hall, J. W.; Alderks, C. E.; Milazzo-Sayre, L.
- Experiences from a probability-based Internet panel: Sample, recruitment and participation; 2013; Scherpenzeel, A.
- An Evaluation of Internet Versus Paper-based Methods for Public Participation Geographic Information...; 2012; Pocewicz, A.; Nielsen-Pincus, M.; Brown, G.; Schnitzer, R.
- Using paradata to explore item-level response times in surveys; 2012; Couper, M. P.; Kreuter, F.
- Specialized Tools for Measuring Past Events; 2012; Belli, R. F.
- Modes of Data Collection; 2012; Tourangeau, R.
- Mode and non-response effects and their treatment; 2012; Chrysanthopoulos, S., Georgostathi, A.
- “I think I know what you did last summer” Improving data quality in panel surveys; 2012; Lugtig, P. J.
- Using Text-to-Speech (TTS) for Audio-CASI; 2012; Couper, M. P.; Kirgis, N.; Buageila, S.; Berglund, P.
- Does Mode Matter? Initial Evidence from the German Longitudinal Election Study (GLES); 2012; Blumenstiel, J. E.; Rossmann, J.
- The Representativity of Web Surveys of the General Population compared to Traditional Modes and Mixed...; 2012; Klausch, L. T.; Schouten, B.; Hox, J.
- Effects of speeding on satisficing in Mixed-Mode Surveys; 2011; Bathelt, S.; Bauknecht, J.
- Web based CATI on Amazon Elastic Compute Cloud and VirtualBox using queXS; 2011; Zammit, A.
- Web/Cloud Based CATI Using queXS; 2011; Zammit, A.
- When Referring to Mode, Is Expressed Preference the Same as Reality?; 2011; Denk, K.
- Three Eras of Survey Research; 2011; Groves, R. M.
- Testing a single mode vs a mixed mode design; 2011; Laaksonen, S.